# Chinese Pre-training
## ERNIE-4.5-0.3B-PT-bf16
**mlx-community** · Apache-2.0 · Large Language Model · Multilingual

ERNIE-4.5-0.3B-PT-bf16 is a model in the ERNIE series developed by Baidu, with 0.3B parameters, provided in bf16 precision.
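Since this checkpoint is published under the mlx-community organization, it can presumably be run with Apple's MLX stack. A minimal sketch, assuming the repo id `mlx-community/ERNIE-4.5-0.3B-PT-bf16` matches the listing and the `mlx-lm` package is installed:

```python
# Minimal text-generation sketch with mlx-lm (requires Apple Silicon;
# the repo id below is assumed from this listing -- verify on the Hub).
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/ERNIE-4.5-0.3B-PT-bf16")
text = generate(
    model,
    tokenizer,
    prompt="Briefly introduce the ERNIE model family.",
    max_tokens=128,
)
print(text)
```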
## rbt4-h312
**hfl** · Apache-2.0 · Large Language Model · Transformers · Chinese

A small Chinese pre-trained model from the MiniRBT project, built with knowledge distillation and trained with whole word masking for better training efficiency.
## minirbt-h256
**hfl** · Apache-2.0 · Large Language Model · Transformers · Chinese

MiniRBT is a small Chinese pre-trained model built with knowledge distillation combined with whole word masking, suitable for a wide range of Chinese natural language processing tasks.
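Both MiniRBT checkpoints above are standard BERT-style masked language models, so they should load directly through Hugging Face Transformers. A minimal fill-mask sketch, assuming the repo id `hfl/minirbt-h256` from this listing:

```python
# Fill-mask sketch with Transformers (repo id assumed from the listing;
# MiniRBT uses the standard BERT-style [MASK] token).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="hfl/minirbt-h256")
for pred in fill_mask("中国的首都是[MASK]京。"):
    print(pred["token_str"], round(pred["score"], 4))
```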
## structbert-large-zh
**junnyu** · Large Language Model · Transformers · Chinese

StructBERT is a novel model that extends BERT by incorporating linguistic structures into pre-training, leveraging two auxiliary tasks to exploit word-level and sentence-level order structures.
## roberta-base-word-chinese-cluecorpussmall
**uer** · Large Language Model · Chinese

A word-level Chinese RoBERTa base model pre-trained on the CLUECorpusSmall corpus; word-level tokenization shortens input sequences and improves processing efficiency.
## chinese-pert-base
**hfl** · Large Language Model · Transformers · Chinese

PERT is a Chinese pre-trained model based on the BERT architecture, focused on improving Chinese text-processing capability.
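PERT checkpoints are typically used as encoders for fine-tuning or feature extraction rather than for masked-word prediction. A minimal embedding sketch, assuming the repo id `hfl/chinese-pert-base` and mean pooling as a simple (assumed) pooling choice:

```python
# Sentence-embedding sketch: mean-pool the last hidden states of the
# PERT encoder (repo id assumed from the listing; verify on the Hub).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-pert-base")
model = AutoModel.from_pretrained("hfl/chinese-pert-base")

inputs = tokenizer("哈尔滨工业大学", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_size)
embedding = hidden.mean(dim=1)                  # crude mean pooling
print(embedding.shape)
```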
## chinese-electra-base-generator
**hfl** · Apache-2.0 · Large Language Model · Transformers · Chinese

Chinese ELECTRA is a pre-trained model developed by the Harbin Institute of Technology-iFLYTEK Joint Laboratory (HFL) on top of the ELECTRA model released by Google and Stanford University, offering a small parameter count with strong performance.
## chinese-electra-small-generator
**hfl** · Apache-2.0 · Large Language Model · Transformers · Chinese

Built by HFL on Google's ELECTRA architecture, the small variant of Chinese ELECTRA has roughly 1/10 the parameters of BERT while delivering comparable performance.
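The generator half of ELECTRA is itself a masked language model, so the two generator checkpoints above can be exercised with a fill-mask pipeline (discriminator checkpoints cannot). A minimal sketch, assuming the repo id `hfl/chinese-electra-small-generator`:

```python
# ELECTRA *generator* checkpoints are masked LMs, so fill-mask applies
# (repo id assumed from the listing; discriminators won't work here).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="hfl/chinese-electra-small-generator")
print(fill_mask("今天天气真[MASK]。")[0]["token_str"])
```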
## ChineseBERT-large
**junnyu** · Large Language Model · Transformers · Chinese

ChineseBERT is a Chinese pre-trained model that fuses glyph and pinyin information, improving Chinese comprehension through enriched character-form features.
## roberta-tiny-word-chinese-cluecorpussmall
**uer** · Large Language Model · Chinese

A word-level Chinese RoBERTa model (tiny configuration) pre-trained on CLUECorpusSmall; the word-based variants outperform their character-based counterparts and process text faster thanks to shorter tokenized sequences.
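The word-level RoBERTa models above (base and tiny) use word rather than character tokenization, which their configuration on the Hub should resolve automatically. A minimal fill-mask sketch, assuming the repo id `uer/roberta-tiny-word-chinese-cluecorpussmall`:

```python
# Fill-mask sketch for the word-level RoBERTa models (repo id assumed
# from the listing; the word-level tokenizer loads via AutoTokenizer).
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="uer/roberta-tiny-word-chinese-cluecorpussmall",
)
print(fill_mask("[MASK]的首都是北京。")[0]["token_str"])
```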